Mosaic enables you to feed events into your SIEM (security information and event management) and log collection systems. To give you better control over the granularity of data attributes sent to downstream systems, Mosaic lets you create event streams, including multiple streams per event source, and collect events separately. To learn more, see About event streaming.
To feed the data, Mosaic uses the Confluent HTTP Source Connector, which fetches data from the Mosaic Event Streaming API endpoint and writes it to a Kafka topic.
The connector polls the specified HTTP endpoint at regular intervals and writes the response to a Kafka topic. Learn more at HTTP Source Connector.
In your Mosaic tenant, configure a management app. Give the app a suitable name, for example, KafkaLogStream.
After saving the management app, open it again and note the Client ID and Client Secret values. You’ll need these parameters to configure the Connector.
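To make sure the credentials work before configuring the connector, you can request an access token using the OAuth2 client credentials flow. This is only a sanity check; the token endpoint path below is an assumption based on the common Mosaic API base URL, so confirm it against your tenant's API reference.

```bash
# Sketch: verify the management app credentials by requesting a client_credentials token.
# The token endpoint path (/oidc/token) is an assumption; confirm it in your Mosaic API reference.
curl -s -X POST "https://api.transmitsecurity.io/oidc/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials" \
  -d "client_id=<CLIENT_ID>" \
  -d "client_secret=<CLIENT_SECRET>"
```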
Before you can start feeding events to Kafka Connect, you have to create event streams in Mosaic. You can create as many event streams as needed.
- In the Admin Portal, navigate to Events streaming and select Create stream.
- Complete the stream configuration by providing the stream identifier, the event type to collect, and the batch size. Set the stream destination to Kafka Connect. For more details, see About event streaming.
- Obtain the request URL by clicking the icon next to the stream name and then selecting Copy URL.
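If you want to confirm that the stream returns events before setting up the connector, you can call the copied URL directly with a bearer token obtained using the management app credentials. This is just a quick check; the placeholders below are illustrative.

```bash
# Sketch: call the Collect events endpoint directly using the URL copied above.
# <ACCESS_TOKEN> is a bearer token obtained with the management app's client credentials.
curl -s "<EVENT_STREAM_URL>" \
  -H "Authorization: Bearer <ACCESS_TOKEN>"
```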
The connector has to be configured to obtain events from the Collect events endpoint. Update the activities-to-kafka-connector.json configuration file as described below:
| Setting | Description |
|---|---|
| URL Configuration | The URL points to the Mosaic Event Streaming API endpoint: `"url": "https://api.transmitsecurity.io/activities/v1/activities/collect?type=<type>&stream_id=<stream_id>"`. Provide the query parameters: `type` is the event type to collect, and `stream_id` is the identifier of the stream created in the previous step. |
| Authentication | The connector uses OAuth2 authentication (for details, see HTTP Source Connector Authentication configuration). Provide the client ID and client secret of the Mosaic management app (see Step 1). |
| Additional settings | Set any other connector properties required by your deployment, such as the target Kafka topic; see the HTTP Source Connector documentation. |
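For reference, a minimal sketch of the configuration file is shown below, written to the path referenced later in this guide. The url, oauth2.client.id, and oauth2.client.secret properties come from the table above; the connector class, auth type, and token URL are assumptions, so verify them (and add the target topic and polling settings) against the HTTP Source Connector documentation for the version you deploy.

```bash
# A minimal sketch of activities-to-kafka-connector.json, written with a heredoc.
# "url", "oauth2.client.id", and "oauth2.client.secret" come from the table above; the connector
# class, auth type, and token URL are assumptions -- verify them, and add the target topic and
# polling settings, against the HTTP Source Connector docs for your version.
# Place or mount this file so it is available inside the Connect container at /tmp/.
cat > /tmp/activities-to-kafka-connector.json <<'EOF'
{
  "name": "activities-connector",
  "config": {
    "connector.class": "io.confluent.connect.http.HttpSourceConnector",
    "url": "https://api.transmitsecurity.io/activities/v1/activities/collect?type=<type>&stream_id=<stream_id>",
    "auth.type": "oauth2",
    "oauth2.token.url": "https://api.transmitsecurity.io/oidc/token",
    "oauth2.client.id": "<CLIENT_ID>",
    "oauth2.client.secret": "<CLIENT_SECRET>"
  }
}
EOF
```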
To connect to your existing Kafka deployment, run the connector using Docker. Inside the container, install the confluentinc/kafka-connect-http-source:latest plugin and start the Connect worker as follows:
- Install the http-source plugin:

```bash
confluent-hub install --no-prompt confluentinc/kafka-connect-http-source:latest &&
confluent-hub install --no-prompt confluentinc/kafka-connect-http:latest
```

- Launch the Kafka Connect worker:

```bash
/etc/confluent/docker/run &
```

- Wait for initialization and then install the activities-to-kafka-connector:

```bash
curl -s -X POST \
  -H "Content-Type: application/json" \
  --data @/tmp/activities-to-kafka-connector.json http://localhost:8083/connectors
```

You should be able to receive data at this stage.
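At this point, you can also confirm that the worker accepted the connector by listing registered connectors through the Kafka Connect REST API:

```bash
# List connectors registered with the Kafka Connect worker; activities-connector should appear here.
curl -s http://localhost:8083/connectors
```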
To test the event streaming locally, make sure your setup includes:
- A Docker container running the activities-connect connector
- Kafka and Zookeeper services
- A Kafka consumer for printing the streamed data
Before running, make sure to update the following values:
"url": "\<EVENT\_STREAM\_URL\>""oauth2.client.id": "\<CLIENT\_ID\>","oauth2.client.secret": "\<CLIENT\_SECRET\>"
To check the status of the connector, run the following Docker command:

```bash
docker exec activities-connect curl http://localhost:8083/connectors/activities-connector/status
```
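If jq is available on the host, you can reduce the status response to just the connector and task states; a healthy connector reports RUNNING for both.

```bash
# Summarize connector and task states; both should be RUNNING for a healthy connector.
docker exec activities-connect curl -s http://localhost:8083/connectors/activities-connector/status \
  | jq '{connector: .connector.state, tasks: [.tasks[].state]}'
```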