Description
After configuring the default PagerDuty notification channel and a monitor that sends notifications through that channel, no notification ever reaches PagerDuty. The indexer logs show errors at the exact moment the notifications should be sent.
Steps to reproduce
Configuration
The PagerDuty channel has been configured as follows:
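(The channel was created through the UI; the original screenshot is not reproduced here. An approximately equivalent definition via the OpenSearch Notifications API would look like the following sketch — the webhook URL is a placeholder, and the name matches the one that appears in the logs below.)

```json
POST _plugins/_notifications/configs
{
  "config": {
    "name": "PagerDuty E2E channel",
    "description": "Default PagerDuty notification channel",
    "config_type": "webhook",
    "is_enabled": true,
    "webhook": {
      "url": "<pagerduty-webhook-url>"
    }
  }
}
```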
Monitor configuration:
The monitor is configured to send a notification every time a new FIM finding is generated. If we create a new test file in a directory monitored by FIM, a new finding is generated, and the monitor triggers correctly, attempting to send the notification via the PagerDuty channel:
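(The monitor was also created through the UI. An approximate API-level sketch of such a document-level monitor follows — the index pattern, query, IDs, and templates are illustrative placeholders, not taken from the actual configuration; field names follow the OpenSearch Alerting API.)

```json
POST _plugins/_alerting/monitors
{
  "type": "monitor",
  "monitor_type": "doc_level_monitor",
  "name": "FIM findings monitor",
  "enabled": true,
  "inputs": [{
    "doc_level_input": {
      "indices": ["<fim-findings-index-pattern>"],
      "queries": [{
        "id": "fim-query",
        "name": "new-fim-finding",
        "query": "event.category:file",
        "tags": []
      }]
    }
  }],
  "triggers": [{
    "document_level_trigger": {
      "name": "notify-pagerduty",
      "severity": "1",
      "condition": {
        "script": { "source": "query[name=new-fim-finding]", "lang": "painless" }
      },
      "actions": [{
        "name": "send-to-pagerduty",
        "destination_id": "<pagerduty-channel-id>",
        "subject_template": { "source": "FIM alert" },
        "message_template": { "source": "New FIM finding from monitor {{ctx.monitor.name}}" }
      }]
    }
  }]
}
```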
At that moment, the following errors appear in the /var/log/wazuh-indexer/wazuh-cluster.log file:
[2026-04-13T15:04:25,730][ERROR][o.o.a.u.DocLevelMonitorQueries] [wazuh-indexer] MapperParsingException[failed to parse]; nested: QueryShardException[No field mapping can be found for the field with name [threat.indicator.type]];
[2026-04-13T15:04:25,730][ERROR][o.o.a.u.DocLevelMonitorQueries] [wazuh-indexer] MapperParsingException[failed to parse]; nested: QueryShardException[No field mapping can be found for the field with name [observer.ingress.interface.name]];
[2026-04-13T15:04:25,730][ERROR][o.o.a.u.DocLevelMonitorQueries] [wazuh-indexer] MapperParsingException[failed to parse]; nested: QueryShardException[No field mapping can be found for the field with name [file.owner]];
[2026-04-13T15:04:25,752][ERROR][o.o.s.c.JoinEngine ] [wazuh-indexer] [CORRELATIONS] Exception encountered while searching correlation rule index for finding id a3060d0e-6578-4c96-8318-3f3e42bb968d
[2026-04-13T15:04:25,754][ERROR][o.o.s.t.TransportCorrelateFindingAction] [wazuh-indexer] Exception occurred while processing correlations for monitor id Y8TVhZ0BMkr02xv7qUPX and finding id a3060d0e-6578-4c96-8318-3f3e42bb968d
[2026-04-13T15:04:25,754][ERROR][o.o.s.u.SecurityAnalyticsException] [wazuh-indexer] Security Analytics error: Cannot invoke "Object.toString()" because the return value of "java.util.Map.get(Object)" is null
[2026-04-13T15:04:38,954][ERROR][o.o.n.c.t.WebhookDestinationTransport] [wazuh-indexer] Exception sending webhook message VsQvhp0BMkr02xv77E6O: org.opensearch.notifications.spi.model.MessageContent@29b208f2
java.io.IOException: Failed: Bad Request
at org.opensearch.notifications.core.client.DestinationHttpClient.validateResponseStatus(DestinationHttpClient.kt:160)
at org.opensearch.notifications.core.client.DestinationHttpClient.execute(DestinationHttpClient.kt:106)
at org.opensearch.notifications.core.transport.WebhookDestinationTransport.sendMessage(WebhookDestinationTransport.kt:41)
at org.opensearch.notifications.core.transport.WebhookDestinationTransport.sendMessage(WebhookDestinationTransport.kt:21)
at org.opensearch.notifications.core.NotificationCoreImpl.sendMessage$lambda$0(NotificationCoreImpl.kt:38)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:74)
at org.opensearch.notifications.core.NotificationCoreImpl.sendMessage(NotificationCoreImpl.kt:35)
at org.opensearch.notifications.send.SendMessageActionHelper.sendMessageThroughSpi(SendMessageActionHelper.kt:692)
at org.opensearch.notifications.send.SendMessageActionHelper.sendWebhookMessage(SendMessageActionHelper.kt:516)
at org.opensearch.notifications.send.SendMessageActionHelper.sendMessageToChannel(SendMessageActionHelper.kt:237)
at org.opensearch.notifications.send.SendMessageActionHelper.access$sendMessageToChannel(SendMessageActionHelper.kt:64)
at org.opensearch.notifications.send.SendMessageActionHelper$sendMessagesInParallel$1$statusDeferredList$1$1.invokeSuspend(SendMessageActionHelper.kt:163)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:34)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)
[2026-04-13T15:04:38,955][WARN ][o.o.n.a.PluginBaseAction ] [wazuh-indexer] notifications:OpenSearchStatusException:
org.opensearch.OpenSearchStatusException: {"event_status_list": [{"config_id":"VsQvhp0BMkr02xv77E6O","config_type":"webhook","config_name":"PagerDuty E2E channel","email_recipient_status":[],"delivery_status":{"status_code":"500","status_text":"Failed to send webhook message Failed: Bad Request"}}]}
at org.opensearch.notifications.send.SendMessageActionHelper.executeRequest(SendMessageActionHelper.kt:104)
at org.opensearch.notifications.send.SendMessageActionHelper$executeRequest$1.invokeSuspend(SendMessageActionHelper.kt)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:34)
at kotlinx.coroutines.internal.ScopeCoroutine.afterResume(Scopes.kt:32)
at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:113)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:47)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)
[2026-04-13T15:04:38,969][INFO ][o.o.a.t.TransportDocLevelMonitorFanOutAction] [wazuh-indexer]
[2026-04-13T15:04:38,970][INFO ][o.o.a.t.TransportDocLevelMonitorFanOutAction] [wazuh-indexer] Completed fan_out for doc level monitor P8eQhp0Bqf3p0D99oOA3 in 422 ms. ExecutionId: P8eQhp0Bqf3p0D99oOA3_2026-04-13T15:04:38.389718308_afc16600-8efc-4f6a-9e4c-803e7e2e2e87
[2026-04-13T15:05:15,609][ERROR][o.o.s.t.TransportCorrelateFindingAction] [wazuh-indexer] Exception occurred while processing correlations for monitor id y8Qmhp0BMkr02xv7_U1m and finding id 384e3672-ba6e-4997-b312-de7f16dfad34
[2026-04-13T15:05:15,710][ERROR][o.o.n.c.t.WebhookDestinationTransport] [wazuh-indexer] Exception sending webhook message default_pagerduty_channel: org.opensearch.notifications.spi.model.MessageContent@71346f07
And no notifications ever arrive at the configured PagerDuty account.
Note: the same errors appear if we click the “Send test message” button on the monitor configuration; that message is not sent either.
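A possible lead: the “Failed: Bad Request” response suggests PagerDuty is rejecting the request body. A generic webhook channel posts the rendered notification text as-is, while the PagerDuty Events API v2 only accepts a specific JSON envelope (routing_key, event_action, and a payload with summary, source, and severity). The sketch below shows that expected envelope for comparison with what the channel sends — the routing key and field values are placeholders, not taken from this report.

```python
import json

# Placeholder: the integration key from the PagerDuty service
# (not included in this report).
ROUTING_KEY = "<integration-key>"

def build_event(summary: str, source: str, severity: str = "critical") -> str:
    """Build the JSON envelope the PagerDuty Events API v2 expects.

    All three payload fields below are required by the Events API v2;
    a request without this structure is rejected with 400 Bad Request.
    """
    event = {
        "routing_key": ROUTING_KEY,
        "event_action": "trigger",
        "payload": {
            "summary": summary,    # required
            "source": source,      # required
            "severity": severity,  # required: critical|error|warning|info
        },
    }
    return json.dumps(event)

body = build_event("New FIM finding generated", "wazuh-indexer")
print(body)
```

If the channel is posting plain text (or a different JSON shape) to the Events API endpoint, that mismatch alone would explain the 400.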