Events
When you use Sequin, Postgres contains all your API data at rest. Events, in contrast, are your data in motion. You can subscribe to events to trigger workflows or upsert data into your application tables. You can even use events to sync data to MySQL or other destinations that Sequin doesn't yet support.
Sequin’s events are powered by Sequin-hosted serverless Kafka clusters. You have three options for consuming events:
- Pull: You can connect to your Kafka instance directly with a Kafka client library.
- Pull: You can use the REST API to read events from your Kafka instance.
- Push: You can add an HTTP sink (i.e. webhooks) to have events POSTed to any destination via HTTP.
Stream and events
Sequin provisions one Kafka cluster per account. Every time a record is inserted, updated, or deleted in the API, Sequin publishes an event to Kafka. Events are published into different topics. Topics follow the convention:

Some example topics for a sync with the ID `sync_1a107d79`:
Example events
An inserted event looks like this:
The event is a JSON object that contains the following properties:

- `collection`: A unique slug assigned by Sequin. This is the name of the API object, like a Stripe `subscription` or a Salesforce `contact`.
- `created_at`: The time the event was created (and inserted into the event stream).
- `data`: All the fields of the API object. Fields are transformed to match their representation in your database. For example, if you've mapped the field `LastName` to `last_name` in your database, it will be `last_name` in the event as well.
- `event`: Either `inserted`, `updated`, or `deleted`.
- `id`: The ID of the record, as assigned by the API.
- `task`: The name of the destination table for this collection in your database.
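Putting these properties together, a hypothetical inserted event might look like the following (all field values here are illustrative, not taken from a real sync):

```json
{
  "collection": "contact",
  "created_at": "2023-09-12T14:25:00Z",
  "data": {
    "id": "c_1234",
    "first_name": "Ada",
    "last_name": "Lovelace"
  },
  "event": "inserted",
  "id": "c_1234",
  "task": "salesforce_contact"
}
```

Note that `data` uses your database's field names (e.g. `last_name`, not `LastName`).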
A deleted event looks like this:
`data` in a deleted event contains the entire deleted record.
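A hypothetical deleted event, with illustrative values; the full record is carried in `data`:

```json
{
  "collection": "contact",
  "created_at": "2023-09-13T09:00:00Z",
  "data": {
    "id": "c_1234",
    "first_name": "Ada",
    "last_name": "Lovelace"
  },
  "event": "deleted",
  "id": "c_1234",
  "task": "salesforce_contact"
}
```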
An updated event looks like this:
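The original sample isn't reproduced here; below is a hypothetical updated event consistent with the description that follows (field values are illustrative):

```json
{
  "collection": "task",
  "created_at": "2023-09-12T14:25:00Z",
  "data": {
    "id": "t_5678",
    "activity_date": "2023-09-12",
    "description": "Follow up with customer"
  },
  "changes": {
    "activity_date": "2023-08-30"
  },
  "event": "updated",
  "id": "t_5678",
  "task": "salesforce_task"
}
```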
Unlike inserted and deleted events, updated events have a `changes` property. `changes` contains the prior values of all columns that were changed. In the example above, `activity_date` was changed from `"2023-08-30"` to `"2023-09-12"`. Because `description` was not changed, it does not appear in the `changes` object.
Authentication
Find instructions for connecting in the “Connection Instructions” tab on the Sequin dashboard.
You’ll use a username and a password to connect. You can use those credentials in your Kafka client or with the REST API (which uses HTTP basic authentication).
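Basic authentication for the REST API can be sketched in Python; the credentials below are placeholders for the ones shown in your dashboard:

```python
import base64

# Placeholder credentials -- substitute the username and password from
# the "Connection Instructions" tab in the Sequin dashboard.
username = "my-cluster-user"
password = "s3cret"

# HTTP basic auth: base64-encode "username:password" and send it in the
# Authorization header of each REST API request.
token = base64.b64encode(f"{username}:{password}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}
```

Most HTTP clients can build this header for you (e.g. `auth=(username, password)` in Python's `requests`), so you rarely need to construct it by hand.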
Connecting with a Kafka client
To connect to the Kafka cluster, you can choose any Kafka client you like. On the “Connection Instructions” tab on the Sequin dashboard, you’ll find code snippets for getting started that include your credentials.
REST API
The REST API is a great option if you’re just getting started, or have a lightweight use case. Under the hood, Sequin uses a serverless Kafka instance from Upstash. Upstash provides two different APIs for consuming events:
The Fetch API is a paginated stream of events. It's the simplest way to consume events: you manage your own offset (your position in the stream), then make a `GET` or `POST` request to fetch a page of events.
There’s just one endpoint:
`GET /fetch`
Here’s an example:
With this approach, you don't need to acknowledge messages. Your app manages the offset, so it simply increments the offset to continue paginating the stream.
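A minimal Python sketch of paging with the Fetch API. The host, topic name, and request-body shape here are assumptions based on Upstash's Fetch API; check the Upstash docs for the exact contract. The request is only constructed, not sent:

```python
import json
import urllib.request

# Placeholder values -- use the URL and credentials from the
# "Connection Instructions" tab, and one of your sync's topics.
BASE_URL = "https://example-rest-kafka.upstash.io"
TOPIC = "my-topic"

def build_fetch_request(offset: int) -> urllib.request.Request:
    """Build a POST /fetch request for a page of events starting at `offset`."""
    body = json.dumps({
        "topic": TOPIC,
        "partition": 0,
        "offset": offset,  # your app tracks and increments this
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/fetch",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_fetch_request(offset=0)
# urllib.request.urlopen(req)  # would perform the actual call (with auth added)
```

After processing a page, your app would persist the new offset and call `build_fetch_request` again to get the next page.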
The Consume API is more full-featured. It uses Kafka’s consumer group mechanism and so more closely matches the standard Kafka protocol. There are two variants:
`GET /consume/{{consumer_group}}/{{instance_id}}/{{topic}}?timeout={{timeout}}`

`[GET | POST] /consume/{{consumer_group}}/{{instance_id}}`
`consumer_group` is the name of the consumer group, while `instance_id` identifies Kafka consumer instances within the same consumer group.
Here’s an example of consuming from a single topic:
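Sketching the single-topic variant in Python (the host, group, instance, and topic names are placeholders; a real consumer would add the basic-auth header and perform the GET):

```python
# Placeholder base URL -- use the one from your "Connection Instructions" tab.
BASE_URL = "https://example-rest-kafka.upstash.io"

def consume_url(consumer_group: str, instance_id: str,
                topic: str, timeout_ms: int = 5000) -> str:
    """Build the single-topic Consume API URL described above."""
    return (f"{BASE_URL}/consume/{consumer_group}/{instance_id}/"
            f"{topic}?timeout={timeout_ms}")

url = consume_url("my-group", "worker-1", "my-topic")
```

Because the Consume API uses Kafka consumer groups, offsets are tracked server-side per group, and multiple instances in the same group share the topic's partitions.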
Both APIs are authenticated with HTTP basic authentication.
To get started with the REST API, see the Upstash docs on the Fetch and Consume APIs.
HTTP sink (webhooks)
Webhooks are currently in private alpha. If you need help configuring webhooks, please contact us.
Instead of pulling events from your Kafka cluster, you can have them pushed to you via HTTP (also called a webhook). You can configure the destination HTTP endpoint, as well as which topics you want sent via webhook, in the Sequin console.
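When events arrive via webhook, your endpoint receives the same JSON event shape described earlier. A minimal dispatch sketch in Python (the handler behavior is illustrative, not prescribed by Sequin):

```python
import json

def handle_event(raw_body: bytes) -> str:
    """Dispatch a Sequin event POSTed to your webhook endpoint.

    The return values below are illustrative -- wire each branch to
    your own upsert/delete logic.
    """
    event = json.loads(raw_body)
    kind = event["event"]  # "inserted", "updated", or "deleted"
    if kind == "inserted":
        return f"upsert {event['collection']} record {event['id']}"
    if kind == "updated":
        # `changes` holds the prior values of the changed columns
        return f"record {event['id']} changed columns: {sorted(event.get('changes', {}))}"
    if kind == "deleted":
        return f"delete record {event['id']}"
    raise ValueError(f"unknown event type: {kind}")
```

Your endpoint should return a 2xx status quickly; do heavy processing asynchronously so retries aren't triggered unnecessarily.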
Webhooks are retried with a backoff policy to ensure delivery.