About events streaming

Our platform enables you to feed events into your SIEM (Security Information and Event Management) and log collection systems. For better control over the granularity of data attributes sent to downstream systems, Mosaic lets you create event streams, including multiple streams per event type, and collect events separately: for example, feeding user events to Splunk and verification events to GCS.

Mosaic supports the following event types:

  • cis: Authentication and user management events, such as when a user is created or logs in to an app. See View user activity.
  • admin: Administrative actions and changes to portal settings, such as adding admins or modifying a tenant. See View admin activity.
  • risk: Detection and response events, such as triggering an action event or getting a recommendation.
  • verify: Identity verification events, such as document verification and face authentication.

Integration options

To feed events into your systems, you can use prebuilt plugins or poll the Event streaming APIs yourself. Prebuilt plugins are normally better optimized for both the source system (Transmit's platform) and the consumers.

Prebuilt plugins

There are multiple patterns a plugin can use to share data with target systems: API polling, pub/sub, and so on. Whichever pattern is implemented, a prebuilt plugin is easier to set up and configure than building an integration from scratch.

API polling

You can set up a cron job or a custom scheduler and query Event streaming APIs at specified intervals. When configuring the batch size and polling interval, think about the approximate number of events. Consider setting a longer polling period for rare events and a shorter polling period for frequent events. Keep in mind that determining the right batch size and polling interval for your organization may require some trial and error.
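As a minimal sketch of such a scheduler, the loop below polls at a fixed interval. The real collector would call the /activities/collect endpoint; here it is stubbed out so the example is self-contained.

```javascript
// Sketch of a polling scheduler. collectFn stands in for the call to the
// /activities/collect endpoint; tune intervalMs per the guidance above.
async function pollEvents(collectFn, { intervalMs, maxRounds }) {
  const collected = [];
  for (let round = 0; round < maxRounds; round++) {
    const batch = await collectFn(); // one /activities/collect call
    collected.push(...batch);
    if (round < maxRounds - 1) {
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
  return collected;
}

// Stub standing in for the real API call (two non-empty batches, then empty)
const fakeBatches = [[{ id: 1 }, { id: 2 }], [{ id: 3 }], []];
const stubCollect = () => Promise.resolve(fakeBatches.shift() ?? []);

const events = await pollEvents(stubCollect, { intervalMs: 50, maxRounds: 3 });
console.log(`collected ${events.length} events`); // collected 3 events
```

In production, you would replace the stub with a fetch call to /activities/collect and run the loop from your cron job or scheduler.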

Setting up event collection consists of several steps:

  1. Enabling event collection. Specify the type of events you want to collect.
Example
  • To collect DRS events, use /activities/start-collect?type=risk
  • To collect admin events, use /activities/start-collect?type=admin
import fetch from 'node-fetch';

async function run() {
  const query = new URLSearchParams({
    type: '<TYPE>', // Event type. One of cis, admin, risk, verify
  }).toString();

  const resp = await fetch(
    `https://api.transmitsecurity.io/activities/v1/activities/start-collect?${query}`,
    {
      method: 'PUT',
      headers: {
        Authorization: 'Bearer <YOUR_TOKEN_HERE>' // Client access token
      }
    }
  );

  const data = await resp.text();
  console.log(data);
}

run();
  2. Creating an event stream. Make sure to provide an ID to identify the stream. The stream ID must be a single string without spaces, and unique for each stream.
import fetch from 'node-fetch';

async function run() {
  const query = new URLSearchParams({
    type: '<TYPE>', // Event type. One of cis, admin, risk, verify
    stream_id: '<STREAM_ID>' // Unique stream ID, without spaces
  }).toString();

  const resp = await fetch(
    `https://api.transmitsecurity.io/activities/v1/activities/stream?${query}`,
    {
      method: 'PUT',
      headers: {
        Authorization: 'Bearer <YOUR_TOKEN_HERE>' // Client access token
      }
    }
  );

  const data = await resp.text();
  console.log(data);
}

run();
  3. Collecting events of a specific type. Specify the batch size, that is, the number of events returned in each batch.
import fetch from 'node-fetch';

async function run() {
  const query = new URLSearchParams({
    type: '<TYPE>', // Event type. One of cis, admin, risk, verify
    stream_id: '<STREAM_ID>', // Unique stream ID
    batch_size: '100' // The number of events to return in each batch
  }).toString();

  const resp = await fetch(
    `https://api.transmitsecurity.io/activities/v1/activities/collect?${query}`,
    {
      method: 'POST',
      headers: {
        Authorization: 'Bearer <YOUR_TOKEN_HERE>'  // Client access token
      }
    }
  );

  const data = await resp.text();
  console.log(data);
}

run();
  4. Stopping event collection and deleting streams that are no longer needed.
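The teardown in the last step might look like the sketch below. Note that the /stop-collect path and the DELETE method on /stream are assumptions inferred from the start-collect and stream endpoints above; verify the exact paths and methods in the API reference before using them. The sketch relies on the global fetch available in Node 18+.

```javascript
// HYPOTHETICAL endpoints: confirm /stop-collect and DELETE /stream in the
// API reference. Uses the global fetch of Node 18+.
const BASE = 'https://api.transmitsecurity.io/activities/v1/activities';

function stopCollectUrl(type) {
  return `${BASE}/stop-collect?${new URLSearchParams({ type })}`;
}

function deleteStreamUrl(type, streamId) {
  return `${BASE}/stream?${new URLSearchParams({ type, stream_id: streamId })}`;
}

async function teardown(type, streamId, token) {
  const headers = { Authorization: `Bearer ${token}` }; // Client access token
  await fetch(stopCollectUrl(type), { method: 'PUT', headers });
  await fetch(deleteStreamUrl(type, streamId), { method: 'DELETE', headers });
}
```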

Migrating from old activities APIs?

If you have previously collected events using the /activities/user or /activities/admin endpoints, follow these recommendations to ensure a seamless transition to the updated event streaming APIs. The new event streaming APIs provide better control over collected data and more flexibility.

Deprecation notice

The /activities/user and /activities/admin endpoints will be deprecated soon.

  1. Start by enabling data collection and creating a new stream, as described above.
  2. During the initial phase, continue monitoring via /activities/user and /activities/admin to avoid missing events. About 15 minutes after creating a new stream, you can start collecting events using the new API.
  3. Once data is consistently collected via the new API, stop using the old API to minimize duplicate events.
  4. Expect duplicates initially, since both APIs run at the same time. To reduce duplicates, prioritize the new API for data collection starting 15 minutes after setting up the new stream.
  5. To prevent data loss, stop using the deprecated endpoints only AFTER you've configured and tested the new solution. Verify incoming data through the new API before fully phasing out the old one.
  6. Make sure to update your Splunk or Sentinel connectors if you configured them to feed events to these systems.
  7. Note that you can retrieve events collected by the deprecated endpoints until the events are deleted by the retention policy.
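The duplicate handling during the overlap window can be sketched as a shared dedup set. This is a sketch only; it assumes each event carries a unique id field, so adjust the key to the actual event payload.

```javascript
// Drop duplicates while both the old and the new API run in parallel.
// Assumes each event has a unique `id` field (adjust to the real payload).
function dedupe(events, seenIds = new Set()) {
  const fresh = [];
  for (const event of events) {
    if (!seenIds.has(event.id)) {
      seenIds.add(event.id);
      fresh.push(event);
    }
  }
  return fresh;
}

// Feed batches from both APIs through the same set during the overlap:
const seen = new Set();
const oldApiBatch = [{ id: 'a' }, { id: 'b' }];
const newApiBatch = [{ id: 'b' }, { id: 'c' }]; // 'b' arrived twice

const merged = [...dedupe(oldApiBatch, seen), ...dedupe(newApiBatch, seen)];
console.log(merged.map((e) => e.id)); // [ 'a', 'b', 'c' ]
```

Keep the set only for the overlap window; once the old API is phased out, the extra bookkeeping is no longer needed.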