Our platform enables you to feed events into your SIEM (security information and event management) and log collection systems. To give you finer control over which data attributes are sent to downstream systems, Mosaic lets you create event streams, including multiple streams per product area, and collect events separately, for example, feeding user events to Splunk and verification events to GCS.
For the complete event list, see About Mosaic activity events.
To feed events into your systems, you can take advantage of the following:
- API polling techniques
- Prebuilt plugins and integrations for third-party systems:
  - Connector for Splunk
  - Connector for Microsoft Sentinel
  - More plugins under development
Prebuilt plugins and integrations are typically better optimized for both the source system (Mosaic) and the consumers. A plugin can use multiple patterns to share data with target systems, such as API polling and pub/sub.
You can manage streams from the Events streaming page:
- Create a stream by clicking + Add stream and configuring its settings. Alternatively, use Event streaming APIs.
- Delete a stream you no longer need by clicking the icon next to the stream name and then Delete stream.
When creating a stream, configure the following settings:
- Stream identifier: Provide a user-friendly name for the stream.
- Event type: Select the events you want to collect:
| Event source | Description |
|---|---|
| Customer identity | Authentication, user management, and Journey events, such as when a user is created or a journey completes successfully. |
| Fraud Prevention | Fraud Prevention events, such as triggering an action event or getting a recommendation. |
| Identity Verification | Identity Verification events, such as document verification and face authentication. |
| Admin activities | Administrative actions and changes to portal settings, such as adding admins or modifying a tenant. |
- Stream destination: Select one of the available plugins or integrations to feed your data into, for example, Kafka or Splunk. Otherwise, select Custom to implement API polling.
- Events per batch: Choose a value between 1 and 1000. When configuring the batch size, estimate the number of events you expect per polling interval and balance batch size against collection frequency. For example, if a stream produces roughly 6,000 events per hour and you poll every 5 minutes, about 500 events arrive per interval, so a batch size of 500 collects each interval's events in a single call.
- Include data enrichment: Enriches audit logs with human-readable values, such as user emails, application names, admin email addresses, and role groups. Enabling this option may impact stream performance.
If you configured a stream with a custom stream destination, run the API call to collect events. Obtain the request URL by clicking the icon next to the stream name and then Copy URL. For example:
https://api.transmitsecurity.io/activities/v1/activities/collect?type=risk&stream_id=12345&batch_size=50.
You can set up a cron job or a custom scheduler to trigger the collection endpoint, and collect events, as often as you need. Consider setting a longer polling period for rare events and a shorter polling period for frequent events. Stop collecting events and delete streams that are no longer needed. Note that Collect events API calls must be authorized with your management app's client access token.
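A client access token can be obtained using the OAuth client credentials flow. Below is a minimal sketch; it assumes the standard Mosaic token endpoint (/oidc/token) and form-encoded client credentials, so verify the exact path and parameters against the authentication docs before relying on it.

```js
import fetch from 'node-fetch';

// Minimal sketch: obtain a management app's client access token via the
// OAuth client credentials flow. The /oidc/token path and parameter names
// are assumptions; check the Mosaic authentication docs for your setup.
async function getAccessToken() {
  const resp = await fetch('https://api.transmitsecurity.io/oidc/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: '<CLIENT_ID>',         // Management app client ID
      client_secret: '<CLIENT_SECRET>', // Management app client secret
      grant_type: 'client_credentials'
    })
  });
  const { access_token } = await resp.json();
  return access_token;
}
```

With a token in hand, the following example calls the collection endpoint: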
```js
import fetch from 'node-fetch';

async function run() {
  const query = new URLSearchParams({
    type: '<TYPE>',           // Event source: one of cis, admin, risk, verify
    stream_id: '<STREAM_ID>', // Unique stream ID
    batch_size: '100'         // The number of events to return in each batch
  }).toString();

  const resp = await fetch(
    `https://api.transmitsecurity.io/activities/v1/activities/collect?${query}`,
    {
      method: 'POST',
      headers: {
        Authorization: 'Bearer <YOUR_TOKEN_HERE>' // Management app's client access token
      }
    }
  );

  const data = await resp.text();
  console.log(data);
}

run();
```
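To automate polling, you could run the collection call on a timer. The sketch below is one way to do it in Node.js; the 5-minute interval and the collectEvents helper name are illustrative choices, not part of the Mosaic API.

```js
import fetch from 'node-fetch';

// Illustrative scheduler: poll the collection endpoint at a fixed interval.
// Tune the interval per stream: longer for rare events, shorter for frequent ones.
const POLL_INTERVAL_MS = 5 * 60 * 1000; // 5 minutes (example value)

async function collectEvents() {
  const query = new URLSearchParams({
    type: '<TYPE>',           // Event source: one of cis, admin, risk, verify
    stream_id: '<STREAM_ID>', // Unique stream ID
    batch_size: '100'
  }).toString();
  const resp = await fetch(
    `https://api.transmitsecurity.io/activities/v1/activities/collect?${query}`,
    { method: 'POST', headers: { Authorization: 'Bearer <YOUR_TOKEN_HERE>' } }
  );
  return resp.text();
}

setInterval(async () => {
  try {
    console.log(await collectEvents());
  } catch (err) {
    console.error('Collection failed; will retry next interval:', err);
  }
}, POLL_INTERVAL_MS);
```

Alternatively, on Unix systems a cron entry such as `*/5 * * * * node collect-events.js` (where collect-events.js is a hypothetical script wrapping the same call) achieves the same result without keeping a long-lived process running.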