
Events Streaming

Event listener

An event listener is a component that listens for specific events that occur within the IDP server and then takes action based on those events. These events can include user activities, such as logins, logouts, or password changes, as well as admin actions like user creation or role assignment.
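
Since the IDP is Keycloak-based (see "Event Triggering in Keycloak" below), listeners plug into Keycloak's EventListenerProvider SPI. The following is a minimal illustrative sketch, not KOBIL's actual implementation; the class name and log output are placeholders.

```java
import org.keycloak.events.Event;
import org.keycloak.events.EventListenerProvider;
import org.keycloak.events.admin.AdminEvent;

// Minimal illustrative listener; class name and log output are placeholders.
public class LoggingEventListener implements EventListenerProvider {

    @Override
    public void onEvent(Event event) {
        // User events: LOGIN, LOGOUT, UPDATE_PASSWORD, ...
        System.out.printf("user event: type=%s user=%s realm=%s%n",
                event.getType(), event.getUserId(), event.getRealmId());
    }

    @Override
    public void onEvent(AdminEvent event, boolean includeRepresentation) {
        // Admin events: user creation, role assignment, ...
        System.out.printf("admin event: op=%s resource=%s%n",
                event.getOperationType(), event.getResourcePath());
    }

    @Override
    public void close() {
        // nothing to release in this sketch
    }
}
```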

Use cases

Use Case 1: User Deletion and Device Data Removal

Scenario

When a user deletion is triggered, the system ensures that all related device information is deleted from the AST services.

Process

  1. The deletion of a user is initiated within the system.
  2. The event listener captures the user deletion event.
  3. The event listener calls the relevant AST APIs to remove the associated device data, and the AST services then delete all device details linked to the user.

Outcome

The user's profile and all related device data are deleted, ensuring that no trace of the user remains within the AST services. This is handled entirely through the interaction between the event listener and the AST services; a rough sketch of such a listener follows.
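
The sketch below shows how a listener could react to the deletion event. The AST base URL and the device-deletion endpoint are assumptions made purely for illustration; only the Keycloak event types are real.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.keycloak.events.admin.AdminEvent;
import org.keycloak.events.admin.OperationType;
import org.keycloak.events.admin.ResourceType;

// Sketch only: the AST base URL and endpoint path are hypothetical.
public class DeviceCleanupListener {

    private static final String AST_BASE_URL = "https://ast.example.com"; // assumed
    private final HttpClient http = HttpClient.newHttpClient();

    public void onAdminEvent(AdminEvent event) throws Exception {
        // React only to deletions of user resources.
        if (event.getResourceType() == ResourceType.USER
                && event.getOperationType() == OperationType.DELETE) {
            // Resource path for user operations has the form "users/{userId}".
            String userId = event.getResourcePath().substring("users/".length());
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(AST_BASE_URL + "/devices/" + userId)) // hypothetical API
                    .DELETE()
                    .build();
            http.send(request, HttpResponse.BodyHandlers.discarding());
        }
    }
}
```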

Use Case 2: Events Streaming

Scenario

When an event is triggered, the system needs to capture and stream the event data for further processing and analytics.

Process

  1. The Kobil-audit-event listener captures the triggered event and makes an API call to the IDP Scheduler at the endpoint /jobs/kobil-audit-events/queue.
  2. The IDP Scheduler, running on top of JobRunr, collects the event data and streams it to Kafka through topics com.kobil.audit for user actions and com.kobil.audit.admin for admin actions.
  3. Consumers can listen to the specific topic and retrieve the data.

Outcome

The system successfully captures and streams event data to Kafka, where it is published to relevant topics.
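
As an illustration of the consumer side (step 3 above), the sketch below subscribes to both topics; the bootstrap server address and group id are placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AuditEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder address
        props.put("group.id", "audit-analytics");         // placeholder group id
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // The topics named in this document: user and admin audit events.
            consumer.subscribe(List.of("com.kobil.audit", "com.kobil.audit.admin"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("topic=%s value=%s%n", record.topic(), record.value());
                }
            }
        }
    }
}
```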

note
  • Kafka is a distributed streaming platform that acts as a message broker, allowing the IDP to publish events as messages through topics.

  • SmartDashboard is one of the consumers currently in use; it receives the streamed events from Kafka and runs analytics on them through Elasticsearch.

  • During the tenant creation process, both the kobil-event and kobil-audit-event-stream listeners are added to the tenant by default. These listeners are responsible for listening to User and Admin events and taking follow-up actions.
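
For an existing realm, the listeners can also be registered manually through the Keycloak admin client, roughly as sketched below; the connection details and realm name are placeholders.

```java
import java.util.List;

import org.keycloak.admin.client.Keycloak;
import org.keycloak.representations.idm.RealmEventsConfigRepresentation;

public class EnableListeners {
    public static void main(String[] args) {
        // Placeholder connection details for illustration only.
        Keycloak keycloak = Keycloak.getInstance(
                "https://idp.example.com", "master", "admin", "admin", "admin-cli");

        RealmEventsConfigRepresentation config =
                keycloak.realm("tenant-realm").getRealmEventsConfig();
        // Listener ids as named in this document.
        config.setEventsListeners(List.of("kobil-event", "kobil-audit-event-stream"));
        keycloak.realm("tenant-realm").updateRealmEventsConfig(config);
    }
}
```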

Execution Flow

1. Event Triggering in Keycloak

Events are triggered in the IDP whenever a significant action occurs, such as a user logging in, logging out, registering a new account, or an administrator modifying user roles or settings. Each event includes details about the action, such as the user involved, the time of the event, and the type of action performed.

(Screenshot: events log)

2. IDP Scheduler

The IDP Scheduler runs on top of JobRunr. Event streaming begins when the IDP sends event data to the scheduler via an API call to /jobs/kobil-audit-events/queue; the scheduler then queues the event for background processing.
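
To illustrate the background-processing side, a JobRunr-based handler behind the queue endpoint might enqueue each incoming event roughly as follows; the class and method names are assumptions.

```java
import org.jobrunr.scheduling.BackgroundJob;

// Sketch: a hypothetical handler behind the /jobs/kobil-audit-events/queue endpoint.
public class AuditEventQueue {

    // Called when the IDP posts event data to the queue endpoint.
    public void queue(String eventJson) {
        // JobRunr persists the job and runs it on a background worker.
        BackgroundJob.enqueue(() -> publishToKafka(eventJson));
    }

    public static void publishToKafka(String eventJson) {
        // Kafka producer call goes here (see "Transmission to Kafka" below).
    }
}
```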

If an API call fails during the streaming process, the IDP stores the failed call and hands it over to the Failed Kobil Audit Event Scheduler, described in the next section.

3. Failed Kobil Audit Event Scheduler

The Failed Kobil Audit Event Scheduler runs at regular intervals and retries the streaming of any stored failed events, ensuring that data streaming remains consistent even when the initial attempt fails. Failed events are persisted in two tables:

  1. Admin events are saved in the FAILED_ADMIN_EVENT_ENTITY table.

  2. User events are saved in the FAILED_EVENT_ENTITY table.
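
A recurring JobRunr job could implement this retry loop roughly as sketched below; the job id, interval, and method names are assumptions.

```java
import org.jobrunr.scheduling.BackgroundJob;

// Sketch of the periodic retry; all names and the interval are assumptions.
public class FailedAuditEventRetry {

    public void register() {
        // Re-attempt streaming of stored failures every 5 minutes (cron assumed).
        BackgroundJob.scheduleRecurrently("failed-kobil-audit-events",
                "*/5 * * * *", FailedAuditEventRetry::retryAll);
    }

    public static void retryAll() {
        // 1. Load rows from FAILED_EVENT_ENTITY and FAILED_ADMIN_EVENT_ENTITY.
        // 2. Re-send each event along the Kafka streaming path.
        // 3. Delete the rows that were streamed successfully.
    }
}
```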

4. Transmission to Kafka

After processing by the IDP Scheduler, the streamed event data is transmitted to Kafka, which delivers the messages reliably to the various consumers.
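
The producer side might look like the following sketch; the bootstrap server address is a placeholder and serialization is simplified to plain strings.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AuditEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // User events go to com.kobil.audit; admin events to com.kobil.audit.admin.
            producer.send(new ProducerRecord<>("com.kobil.audit",
                    "user-123", "{\"type\":\"LOGIN\",\"time\":1700000000}"));
            producer.flush();
        }
    }
}
```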
