Sequin’s events are powered by Sequin-hosted serverless Kafka clusters. You have three options for consuming events:

  • Pull: You can connect to your Kafka instance directly with a Kafka client library.
  • Pull: You can use the REST API to read events from your Kafka instance.
  • Push: You can add an HTTP sink (or webhooks) to have events POSTed to any destination via HTTP.

Stream and events

Sequin provisions one Kafka cluster per account. Every time a record is inserted, updated, or deleted in the API, Sequin publishes an event to Kafka. Events are published to different topics, which follow the convention:

sequin.[sync_id].[collection]

Some example topics for a sync with the id sync_1a107d79:

sequin.sync_1a107d79.contact
sequin.sync_1a107d79.deal
sequin.sync_1a107d79.invoice_item

Example events

An inserted event looks like this:

{
    "collection": "task",
    "created_at": "2024-11-10T18:39:00.070453Z",
    "data": {
      "activity_date": "2023-09-12",
      "description": "task description [ … ] ",
      // …
    },
    "event": "inserted",
    "id": "079013db-8b17-44cd-8528-f5e68fc61333",
    "table_db_name": "task"
}

The event is a JSON object that contains the following properties:

  • collection: a unique slug assigned by Sequin. This is the name of the API object, like a Stripe subscription or a Salesforce contact.
  • created_at: The time the event was created (and inserted into the event stream).
  • data: All the fields of the API object. Fields are transformed to match their representation in your database. For example, if you’ve mapped the field LastName to last_name in your database, it will be last_name in the event as well.
  • event: Either inserted, updated, or deleted.
  • id: The ID of the record, as assigned by the API.
  • table_db_name: The name of the destination table for this collection in your database.
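If you’re consuming these events in code, a handler typically parses the JSON and branches on the event property. Here’s a minimal sketch in Python; the upsert_record and delete_record helpers are hypothetical stand-ins for your own persistence logic:

import json

def handle_event(raw: bytes) -> None:
    event = json.loads(raw)
    collection = event["collection"]
    record = event["data"]

    # Branch on the event type: inserted, updated, or deleted.
    if event["event"] == "inserted":
        upsert_record(collection, record)       # hypothetical helper
    elif event["event"] == "updated":
        upsert_record(collection, record)       # data holds the new state
    elif event["event"] == "deleted":
        delete_record(collection, event["id"])  # hypothetical helper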

A deleted event looks like this:

{
    "collection": "task",
    "created_at": "2024-11-10T18:39:00.070453Z",
    "data": {
      "activity_date": "2023-09-12",
      "description": "task description [ … ] ",
      // …
    },
    "event": "deleted",
    "id": "079013db-8b17-44cd-8528-f5e68fc61333",
    "table_db_name": "task"
}

data in a deleted event contains the entire deleted record.

An updated event looks like this:

{
    "changes": {
      "activity_date": "2023-08-30"
    },
    "collection": "task",
    "created_at": "2024-11-10T18:39:00.070453Z",
    "data": {
      "activity_date": "2023-09-12",
      "description": "task description [ … ] ",
      // …
    },
    "event": "updated",
    "id": "079013db-8b17-44cd-8528-f5e68fc61333",
    "table_db_name": "task"
}

Unlike inserted and deleted events, updated events have a changes property. changes contains the prior values of all columns that were changed. In the example above, activity_date was changed from "2023-08-30" to "2023-09-12". Because description was not changed, it does not appear in the changes object.
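This means you can reconstruct the record’s prior state by overlaying changes on data. A minimal sketch, assuming the event shape shown above:

def prior_state(event: dict) -> dict:
    # data holds the record after the update; changes holds the
    # pre-update values of only the columns that changed.
    return {**event["data"], **event["changes"]}

# For the example above, prior_state(event)["activity_date"] returns
# "2023-08-30", while event["data"]["activity_date"] is "2023-09-12".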

Authentication

Find instructions for connecting in the “Connection Instructions” tab on the Sequin dashboard.

You’ll use a username and a password to connect. You can use those credentials in your Kafka client or with the REST API (which uses HTTP basic authentication).

Connecting with a Kafka client

To connect to the Kafka cluster, you can choose any Kafka client you like. On the “Connection Instructions” tab on the Sequin dashboard, you’ll find code snippets for getting started that include your credentials.
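For example, here’s a minimal sketch using the confluent-kafka Python client. The bootstrap server, security settings, and credentials below are placeholders; use the exact values from your “Connection Instructions” tab:

import json
from confluent_kafka import Consumer

consumer = Consumer({
    # Placeholder values; copy the real ones from the dashboard.
    "bootstrap.servers": "stingray-271.sequin.io:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-256",
    "sasl.username": "myuser",
    "sasl.password": "mypass",
    "group.id": "mygroup",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["sequin.sync_1a107d79.contact"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(msg.error())
            continue
        event = json.loads(msg.value())
        print(event["event"], event["collection"], event["id"])
finally:
    consumer.close()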

REST API

The REST API is a great option if you’re just getting started, or have a lightweight use case. Under the hood, Sequin uses a serverless Kafka instance from Upstash. Upstash provides two different APIs for consuming events:

The Fetch API is a pageable stream of events. It’s the simplest way to consume events. You manage your offset (where in the stream you are), then make a GET or POST request to fetch a page of events.

There’s just one endpoint:

  • [GET | POST] /fetch

Here’s an example:

curl https://stingray-271.sequin.io/fetch -u myuser:mypass \
    -d '{"topic": "greetings", "partition": 3, "offset": 11, "timeout": 1000}'

With this approach, you don’t need to acknowledge messages. Your app manages the offset, so to continue paginating the stream you simply increment it.
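For instance, a polling loop might look like the following in Python. It assumes the response is a JSON array of messages, each with offset and value fields; check the Upstash docs for the exact response shape:

import json
import requests

BASE_URL = "https://stingray-271.sequin.io"  # your instance URL
AUTH = ("myuser", "mypass")

offset = 0
while True:
    resp = requests.post(
        f"{BASE_URL}/fetch",
        auth=AUTH,
        json={"topic": "sequin.sync_1a107d79.task",
              "partition": 0, "offset": offset, "timeout": 1000},
    )
    resp.raise_for_status()
    messages = resp.json()
    for message in messages:
        event = json.loads(message["value"])
        print(event["event"], event["id"])
    if messages:
        # Advance past the last message we saw to keep paginating.
        offset = messages[-1]["offset"] + 1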

The Consume API is more full-featured. It uses Kafka’s consumer group mechanism and so more closely matches the standard Kafka protocol. There are two variants:

  • GET /consume/{{consumer_group}}/{{instance_id}}/{{topic}}?timeout={{timeout}}
  • [GET | POST] /consume/{{consumer_group}}/{{instance_id}}

consumer_group is the name of the consumer group, while instance_id identifies Kafka consumer instances within the same consumer group.

Here’s an example of consuming from a single topic:

curl https://stingray-271.sequin.io/consume/mygroup/myconsumer/greetings -u myuser:mypass
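Because the consumer group tracks your offset server-side, a client can simply poll the same URL in a loop. A minimal sketch in Python, assuming the response is a JSON array of messages with a value field (as above, check the Upstash docs for the exact shape):

import requests

URL = "https://stingray-271.sequin.io/consume/mygroup/myconsumer/greetings"
AUTH = ("myuser", "mypass")

# The consumer group tracks the committed offset server-side,
# so each request picks up where the last one left off.
while True:
    resp = requests.get(URL, auth=AUTH)
    resp.raise_for_status()
    for message in resp.json():
        print(message["value"])  # for Sequin topics, an event JSON string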

Both APIs are authenticated with HTTP basic authentication.

To get started with the REST API, see the Upstash docs on the Fetch and Consume APIs.

HTTP sink (webhooks)

Webhooks are currently in private alpha. If you need help configuring webhooks, please contact us.

Instead of pulling events from your Kafka cluster, you can have them pushed to you via HTTP (also called a webhook). You can configure the destination HTTP endpoint, as well as which topics are sent via webhook, in the Sequin console.

Webhooks will be retried with a backoff policy to ensure delivery.
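If you’re building a receiving endpoint, a minimal sketch with Flask might look like the following. It assumes each delivery POSTs one event JSON object like those shown above (the payload shape for the alpha may differ); respond with a 2xx promptly so the delivery isn’t retried:

from flask import Flask, request

app = Flask(__name__)

@app.route("/sequin-events", methods=["POST"])
def receive_event():
    event = request.get_json(force=True)
    # Hand off to a queue or background worker in production;
    # keep the response fast so the delivery isn't retried.
    print(event["event"], event["collection"], event["id"])
    return "", 200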