In this quickstart, you’ll:
- Boot Sequin
- Connect to a sample playground database
- Configure a Kafka topic to receive database changes
- See your changes flow in real-time
Boot Kafka (optional)
If you don’t already have Kafka running, start Kafka with Docker Compose:
- Download our Docker compose file for Kafka (right click, save link as…).
- Move it to a new directory, navigate to that directory, and start the services:
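For example, a minimal sketch (the directory name is just a suggestion):

```bash
mkdir kafka && cd kafka
# Move the downloaded compose file here, then start the services in the background:
docker compose up -d
```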
Alternative: Download with curl
Alternative: Copy/paste raw Kafka docker-compose.yaml
If you’re using another Kafka instance, ensure you have the connection details ready.
Create the Kafka topic
Next, create the Kafka topic that will receive our database changes. Create it using the Kafka container you just started; after running the command below, you should see output confirming the topic was created.
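A sketch of the topic-creation command, assuming a Bitnami-style Kafka image whose container is named kafka and whose broker listens on localhost:9092 inside the container (check docker ps for your actual container name; Confluent images use kafka-topics without the .sh suffix):

```bash
# Create the "products" topic with a single partition and replica.
docker exec kafka kafka-topics.sh \
  --create \
  --topic products \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```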
Run Sequin
The easiest way to get started with Sequin is with our Docker Compose file. This file starts a Postgres database, Redis instance, and Sequin server.
Create directory and start services
- Download sequin-docker-compose.zip.
- Unzip the file.
- Navigate to the unzipped directory and start the services:
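For example (the directory name created by the zip is an assumption):

```bash
cd sequin-docker-compose
# Start Postgres, Redis, and the Sequin server in the background:
docker compose up -d
```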
Alternative: Download with curl
Alternative: Clone the repository
Login
The Docker Compose file automatically configures Sequin with an admin user and a playground database. Let’s log in to the Sequin web console:
Open the web console
After starting the Docker Compose services, open the Sequin web console at http://localhost:7376:

View the playground database
To get you started quickly, Sequin’s Docker Compose file creates a logical database called sequin_playground with a sample dataset in the public.products table. Let’s take a look:
Select playground database
Click on the pre-configured sequin-playground database:
The database “Health” should be green.
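If you’d like to peek at the sample data directly, here’s a psql sketch. The connection string is an assumption based on the bundled Docker Compose file; adjust host, port, and credentials if yours differ:

```bash
# List a few rows from the sample products table.
psql "postgresql://postgres:postgres@localhost:7377/sequin_playground" \
  -c "select * from products limit 5;"
```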
Create a Kafka Sink
With the playground database connected, you can create a sink. This sink will send changes to the products table to your Kafka topic:
Note “Source” configuration
In the “Source” card, note that the sequin-playground database is selected and all schemas and tables are included. Leave these defaults:
Set up a backfill
In the “Initial backfill” card, select the public.products table to initiate a backfill when the sink is created.
Configure "Kafka Configuration"
In the “Kafka Configuration” card, enter your Kafka connection details:
- Hosts: If running locally with the provided docker-compose, use host.docker.internal:9092
- Topic: The Kafka topic to stream to (e.g., products)
- SASL Mechanism: Select if your Kafka cluster requires authentication
- Username/Password: Required if SASL is enabled
- TLS: Toggle on if your Kafka cluster requires TLS

Test the connection
At the bottom of the form, click the “Test Connection” button. If you provided proper credentials, it should succeed.
Sequin can connect to your Kafka cluster.
See changes flow to your Kafka topic
On the new sink’s overview page, you should see the “Health” status turn green, indicating data is flowing to your Kafka topic. Let’s confirm messages are flowing:
Messages tab
Click the “Messages” tab. You’ll see a list of the recently delivered messages:

Sequin indicates it backfilled the products table to your Kafka topic.
View in Kafka CLI
In your terminal, run a console consumer to read from the topic (sketched below). You should see the messages that were sent from Sequin; these are read events from the initial backfill of the products table.
Messages are flowing from Sequin to your Kafka topic.
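A consumer sketch, with the same container-name assumption as the topic-creation step:

```bash
# Read everything on the topic from the beginning.
docker exec kafka kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic products \
  --from-beginning
```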
Make some changes
Let’s make some changes to the products table and see them flow to your topic.
In another terminal, consume messages from your Kafka topic (the same console-consumer command shown above works here).
In your terminal, run the following command to insert a new row into the products table:
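A psql sketch, reusing the connection-string assumption from earlier; the column names and values are assumptions about the sample schema:

```bash
# Insert a new product row; Sequin should pick up the change.
psql "postgresql://postgres:postgres@localhost:7377/sequin_playground" \
  -c "insert into products (name, price) values ('Organic Honey (16 oz)', 12.99);"
```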
You should see a message corresponding to the inserted row. Feel free to try other changes:
Update a product's price
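For example, under the same assumptions:

```bash
# Change the price of the row inserted above.
psql "postgresql://postgres:postgres@localhost:7377/sequin_playground" \
  -c "update products set price = 7.99 where name = 'Organic Honey (16 oz)';"
```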
Change a product's name
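For example, under the same assumptions:

```bash
# Rename the product; this emits another change event.
psql "postgresql://postgres:postgres@localhost:7377/sequin_playground" \
  -c "update products set name = 'Organic Wildflower Honey (16 oz)' where name = 'Organic Honey (16 oz)';"
```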
Delete a product
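For example, under the same assumptions:

```bash
# Delete the row; a delete event should appear on the topic.
psql "postgresql://postgres:postgres@localhost:7377/sequin_playground" \
  -c "delete from products where name = 'Organic Wildflower Honey (16 oz)';"
```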
Each change will appear in your Kafka topic within a few seconds.
Great work!
You’ve:
- Set up a complete Postgres change data capture pipeline
- Loaded existing data through a backfill
- Made changes to the products table
- Verified changes are flowing to your Kafka topic

