How to set up real-time Postgres CDC
Learn how to capture and deliver every insert, update, and delete in your Postgres database to a message queue or stream with Sequin.
This guide serves as an overview of how to use Sequin to create real-time Postgres change data capture (CDC) pipelines.
By the end, you’ll have a complete change data capture pipeline that streams database changes to a message queue or stream in real-time.
We also have more detailed guides for each sink destination.
Prerequisites
If you’re self-hosting Sequin, you’ll need:
- Sequin installed
- A database connected
- A sink destination (like SQS, Kafka, Redis, or HTTP)
If you’re using Sequin Cloud, you’ll need:
- A Sequin Cloud account
- A database connected
- A sink destination (like SQS, Kafka, Redis, or HTTP)
CDC architecture overview
Your change data capture pipeline will have three components:
- Source table(s): The table(s) in Postgres that you want to capture changes from.
- Filters and transformations: Optional filters and transformations to apply to changes before they’re delivered to your sink.
- Destination sink: The message queue, stream, or webhook endpoint that Sequin delivers changes to (e.g. SQS, Kafka, or an HTTP endpoint).
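To make the pipeline concrete, here is a sketch of what a single change message might look like as it moves from source table, through filters and transformations, to the sink. The field names below are illustrative assumptions, not Sequin's exact wire format; consult the messages reference for the real payload shape.

```python
# Hypothetical shape of one change message in the pipeline.
# Field names are illustrative, not Sequin's exact wire format.
change_message = {
    "action": "update",                      # insert | update | delete
    "record": {"id": 42, "status": "paid"},  # the row after the change
    "changes": {"status": "pending"},        # prior values of changed columns
    "metadata": {
        "table_name": "orders",
        "commit_timestamp": "2025-01-01T00:00:00Z",
    },
}

assert change_message["action"] in {"insert", "update", "delete"}
```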
Create a sink
With your Postgres database connected to Sequin, create a sink. Navigate to the Sinks tab and click the Create Sink button. Select the sink destination you want to use.
Select the source
Select the table you want to capture changes from.
Add filters and transformations
You can add SQL filters to capture a subset of the source table.
You can also add transformations to modify the data before it’s delivered to your sink.
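Conceptually, a filter is a predicate over each change and a transformation is a function applied to the changes that pass. The sketch below simulates this in plain Python; the predicate, transform, and field names are hypothetical examples, not Sequin's filter or transform syntax.

```python
# Illustrative sketch of filtering and transforming changes before delivery.
# The predicate and transform are hypothetical, not Sequin syntax.
def passes_filter(message: dict) -> bool:
    # e.g. only capture orders over $100
    # (analogous to a SQL filter like: amount > 100)
    return message["record"].get("amount", 0) > 100

def transform(message: dict) -> dict:
    # e.g. deliver only the fields downstream consumers need
    return {"id": message["record"]["id"], "amount": message["record"]["amount"]}

changes = [
    {"action": "insert", "record": {"id": 1, "amount": 250}},
    {"action": "insert", "record": {"id": 2, "amount": 40}},
]
delivered = [transform(m) for m in changes if passes_filter(m)]
# delivered == [{"id": 1, "amount": 250}]
```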
Specify backfill
To stream existing rows in your source table, specify a backfill.
You can backfill all the data in your source table or specify a custom subset of the table using a SQL filter.
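Conceptually, a backfill reads the rows that already exist in the source table (optionally narrowed by a filter) and emits them as messages before live changes stream through. The sketch below simulates that selection in plain Python; the table contents, the `"read"` action label, and the filter are assumptions for illustration.

```python
# Conceptual sketch of a backfill: select existing rows (optionally filtered)
# and emit them as messages. Data and action label are hypothetical.
existing_rows = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "archived"},
    {"id": 3, "status": "active"},
]

def backfill(rows, row_filter=None):
    # row_filter plays the role of the SQL filter on the backfill
    selected = [r for r in rows if row_filter is None or row_filter(r)]
    return [{"action": "read", "record": r} for r in selected]

# Backfill only a subset, analogous to a SQL filter like: status = 'active'
messages = backfill(existing_rows, lambda r: r["status"] == "active")
# len(messages) == 2
```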
Set message grouping
Messages in the same group are delivered in order. By default, Sequin will group messages by primary key.
You can configure a different grouping strategy by selecting custom columns from the source table.
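Grouping can be pictured as computing a key per message: messages that share a key are delivered in order, while different groups may proceed independently. This sketch shows grouping by a custom column instead of the primary key; the column names and message data are illustrative assumptions.

```python
# Sketch of ordered delivery within a group: messages sharing a group key
# (primary key by default, or custom columns) stay in order relative to
# each other. Column names and data are illustrative.
from collections import defaultdict

def group_key(message: dict, group_columns=("id",)) -> tuple:
    return tuple(message["record"][col] for col in group_columns)

messages = [
    {"record": {"id": 1, "account_id": "a"}, "seq": 1},
    {"record": {"id": 2, "account_id": "b"}, "seq": 2},
    {"record": {"id": 1, "account_id": "a"}, "seq": 3},
]

groups = defaultdict(list)
for m in messages:
    # Group by a custom column (account_id) instead of the primary key:
    groups[group_key(m, ("account_id",))].append(m["seq"])
# groups[("a",)] == [1, 3] -> delivered in order within the group
```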
Configure sink-specific settings
Each sink destination has its own settings:
- Azure Event Hubs sinks: Specify your Azure Event Hubs credentials and topic to deliver messages to.
- Google Cloud Pub/Sub sinks: Specify your GCP Pub/Sub credentials and the topic to deliver messages to.
- Kafka sinks: Specify your Kafka credentials and topic to deliver messages to.
- NATS sinks: Specify your NATS credentials and subject to deliver messages to.
- RabbitMQ sinks: Specify your RabbitMQ credentials and queue to deliver messages to.
- Redis sinks: Specify your Redis credentials and key to deliver messages to.
- Sequin Stream sinks: Specify your Sequin Stream to deliver messages to.
- SQS sinks: Specify your SQS queue to deliver messages to.
- Webhook sinks: Specify the HTTP endpoint to deliver messages to.
Start the sink
Click Create Sink to create the sink and start streaming changes.
Verify your CDC pipeline is working
If you specified a backfill, there should be messages in your stream ready for your system to process. To observe these messages, navigate to the sink overview page and click the “Messages” tab. You should see messages flowing to your sink.
If you didn’t specify a backfill, then create a change in your database (e.g. an insert, update, or delete) and verify that a message is delivered on the “Messages” tab.
Next steps
You now have a complete change data capture pipeline that streams Postgres changes to a message queue or stream in real-time.
If you are using Sequin Cloud, you are ready to scale. If you are self-hosting Sequin, see “Deploy to production” for guidance on copying your local sink configuration to your production environment.
Messages reference
Learn about the difference between change and record messages.
Backfills
Learn how to backfill existing data from your source tables.
Filters
Learn about filters and how to use them to filter messages before they are sent to your destination.
Transforms
Learn how to transform messages before they are sent to your destination.